China Telecom Open-Sources the TeleChat3 Large-Scale MoE Model: Full-Stack In-House Development, Trained on 15T Tokens, with a Thinking Mode to Compete with Top International Models
China Telecom's Artificial Intelligence Research Institute has open-sourced the TeleChat3 series of "Star" large language models, including a trillion-parameter MoE model and a dense-architecture model. The series was trained entirely on domestic computing clusters with a dataset of 15 trillion tokens, achieving full-stack localization and marking a key breakthrough in China's pursuit of independent, controllable large-scale AI models.